Unifying abstract inexact convergence theorems for descent methods and block coordinate variable metric iPiano

Author

  • Peter Ochs
Abstract

An abstract convergence theorem for a class of descent methods that explicitly models relative errors is proved. The convergence theorem generalizes and unifies several recent abstract convergence theorems, and is applicable to possibly non-smooth and non-convex lower semi-continuous functions that satisfy the Kurdyka–Łojasiewicz inequality, which comprises a huge class of problems. The descent property is measured with respect to a function that is allowed to change along the iterations, which makes block coordinate and variable metric methods amenable to the abstract convergence theorem. As a particular algorithm, the convergence of a block coordinate variable metric version of iPiano (an inertial forward–backward splitting algorithm) is proved. The newly introduced algorithms perform favorably on an inpainting problem with a Mumford–Shah-like regularization from image processing.
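
The core update behind iPiano is an inertial forward–backward step. Below is a minimal sketch of that plain step on a small l1-regularized least-squares toy problem, assuming a gradient oracle grad_f for the smooth part and a proximal map prox_g for the non-smooth part; the names ipiano_step, grad_f, prox_g and the constants alpha, beta are illustrative, and the block coordinate and variable metric features introduced in the paper are omitted.

```python
import numpy as np

def ipiano_step(x, x_prev, grad_f, prox_g, alpha, beta):
    """One inertial forward-backward update:
    x_new = prox_{alpha*g}( x - alpha*grad_f(x) + beta*(x - x_prev) ).
    alpha (step size) and beta (inertia weight) must satisfy the method's
    step-size conditions; the values used below are only chosen for this toy problem."""
    y = x - alpha * grad_f(x) + beta * (x - x_prev)
    return prox_g(y, alpha)

# Toy problem: f(x) = 0.5*||A x - b||^2 (smooth), g(x) = lam*||x||_1 (non-smooth).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda y, a: np.sign(y) * np.maximum(np.abs(y) - a * lam, 0.0)  # soft-thresholding

x = x_prev = np.zeros(10)
for _ in range(300):
    x, x_prev = ipiano_step(x, x_prev, grad_f, prox_g, alpha=0.005, beta=0.5), x
```

With beta set to 0 the step reduces to the standard proximal gradient (forward–backward) iteration; the inertia term beta*(x - x_prev) is what distinguishes iPiano-type methods.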


Similar references

Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method)

In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial or directional derivatives of the objective function. We introduce a unifying framework which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on a...


A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization

The block coordinate descent (BCD) method is widely used for minimizing a continuous function f of several block variables. At each iteration of this method, a single block of variables is optimized, while the remaining variables are held fixed. To ensure the convergence of the BCD method, the subproblem of each block variable needs to be solved to its unique global optimum. Unfortunately, this...
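
As a point of reference, the following is a minimal sketch of the cyclic BCD loop described above, applied to a convex quadratic where each block subproblem can be solved exactly; the names cyclic_bcd and block_min are illustrative helpers and are not taken from the referenced paper.

```python
import numpy as np

def cyclic_bcd(x0, blocks, block_min, n_sweeps=50):
    """Cyclic block coordinate descent: in every sweep, each block of
    coordinates is updated by minimizing the objective over that block
    while all other coordinates are held fixed. block_min(x, idx) is an
    assumed inner solver returning the (exact or approximate) block update."""
    x = x0.copy()
    for _ in range(n_sweeps):
        for idx in blocks:
            x[idx] = block_min(x, idx)
    return x

# Example: f(x) = 0.5 * x.T @ Q @ x - b @ x with Q positive definite, two blocks.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
Q = M @ M.T + 6.0 * np.eye(6)
b = rng.standard_normal(6)
blocks = [np.arange(0, 3), np.arange(3, 6)]

def block_min(x, idx):
    # Exact block minimizer: solve Q[idx, idx] @ x[idx] = b[idx] - Q[idx, rest] @ x[rest].
    rest = np.setdiff1d(np.arange(x.size), idx)
    return np.linalg.solve(Q[np.ix_(idx, idx)], b[idx] - Q[np.ix_(idx, rest)] @ x[rest])

x_bcd = cyclic_bcd(np.zeros(6), blocks, block_min)
```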


Inexact block coordinate descent methods with application to the nonnegative matrix factorization

This work is concerned with the cyclic block coordinate descent method, or nonlinear Gauss-Seidel method, where the solution of an optimization problem is achieved by partitioning the variables into blocks and successively minimizing with respect to each block. The properties of the objective function that guarantee the convergence of such an alternating scheme have been widely investigated in the l...
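
For concreteness, here is a minimal sketch of such a two-block alternating scheme for nonnegative matrix factorization, using the classical multiplicative updates as one possible inexact block step for the Frobenius-norm objective; this is a generic illustration, not the specific inexact scheme analyzed in the referenced paper, and nmf_two_block is an illustrative name.

```python
import numpy as np

def nmf_two_block(V, r, n_iter=200, eps=1e-9, seed=0):
    """Two-block coordinate scheme for V ~ W @ H with W, H >= 0.
    Each sweep updates the H-block and then the W-block with a
    multiplicative rule, i.e. an inexact minimization of
    ||V - W H||_F^2 over one block with the other held fixed."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # inexact step on the H-block
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # inexact step on the W-block
    return W, H

# Usage: rank-3 factorization of a small nonnegative matrix.
V = np.random.default_rng(1).random((30, 20))
W, H = nmf_two_block(V, r=3)
```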


A Comparison of Inexact Newton and Coordinate Descent Mesh Optimization Techniques

We compare inexact Newton and coordinate descent methods for optimizing the quality of a mesh by repositioning the vertices, where quality is measured by the harmonic mean of the mean-ratio metric. The effects of problem size, element size heterogeneity, and various vertex displacement schemes on the performance of these algorithms are assessed for a series of tetrahedral meshes.


Parallel Successive Convex Approximation for Nonsmooth Nonconvex Optimization

Consider the problem of minimizing the sum of a smooth (possibly non-convex) and a convex (possibly nonsmooth) function involving a large number of variables. A popular approach to solve this problem is the block coordinate descent (BCD) method whereby at each iteration only one variable block is updated while the remaining variables are held fixed. With the recent advances in the developments ...




Publication year: 2016